Information Distances versus Entropy Metric

Authors
Abstract


Similar articles

Information Distances versus Entropy Metric

Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from the entropy metric: the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distributions, up to a constant, the expected val...
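The connection sketched in this abstract — that expected Kolmogorov complexity tracks Shannon entropy up to a constant — can be illustrated empirically. Kolmogorov complexity is uncomputable, so the sketch below uses zlib compression as a crude, practical stand-in; the distribution and all variable names are illustrative assumptions, not material from the paper.

```python
import math
import random
import zlib

random.seed(0)

# A simple computable distribution over four symbols.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Shannon entropy in bits per symbol: H = -sum p * log2 p.
H = -sum(p * math.log2(p) for p in probs.values())

# Crude stand-in for expected Kolmogorov complexity: the compressed
# length (bits per symbol) of a long sample drawn from the distribution.
n = 20000
symbols, weights = zip(*probs.items())
sample = "".join(random.choices(symbols, weights=weights, k=n))
bits_per_symbol = 8 * len(zlib.compress(sample.encode())) / n

print(f"Shannon entropy : {H:.3f} bits/symbol")
print(f"Compressed rate : {bits_per_symbol:.3f} bits/symbol")
```

For this distribution H = 1.75 bits/symbol, and the compressed rate lands close to it, modulo compressor overhead — a rough analogue of the "up to a constant" relationship.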


Information entropy, information distances, and complexity in atoms.

Shannon information entropies in position and momentum spaces and their sum are calculated as functions of Z (2 ≤ Z ≤ 54) in atoms. Roothaan-Hartree-Fock electron wave functions are used. The universal property S = a + b ln Z is verified. In addition, we calculate the Kullback-Leibler relative entropy, the Jensen-Shannon divergence, Onicescu's information energy, and a complexity measu...


Understanding Shannon's Entropy metric for Information

Conceptually, information can be thought of as being stored in or transmitted as variables that can take on different values. A variable can be thought of as a unit of storage that can take on, at different times, one of several different specified values, following some process for taking on those values. Informally, we get information from a variable by looking at its value, just as we get in...
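The idea that observing a variable's value yields information can be made quantitative: Shannon entropy is the expected number of bits gained per observation. A minimal sketch, with invented example distributions:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: expected information from observing
    a variable with the given outcome probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit per observation
print(entropy([0.9, 0.1]))   # biased coin: less surprise, fewer bits
print(entropy([1.0]))        # constant variable: no information at all
```

The constant variable illustrates the informal point above: if looking at a value can never surprise us, observing it conveys zero information.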


Information-theoretic Semi-supervised Metric Learning via Entropy Regularization

We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For met...


Duality of Metric Entropy

If K is a pre-compact subset of a normed space X and ε > 0, the quantity log N(K, εB_X) describes the complexity of K at the level of resolution ε. A 1972 conjecture of A. Pietsch – originally stated in the operator-theoretic language – asserts that the two "metric entropy functionals," ε → log N(K, εB_X) and ε → log N(B_X*, εK°), are equivalent in the appropriate sense, uniformly over normed spaces X and over ...



Journal

Journal title: Entropy

Year: 2017

ISSN: 1099-4300

DOI: 10.3390/e19060260